Modulation of Visually Evoked Postural Responses by Contextual Visual, Haptic and Auditory Information: A ‘Virtual Reality Check’
Authors
Abstract
Externally generated visual motion signals can cause the illusion of self-motion in space (vection) and corresponding visually evoked postural responses (VEPRs). These VEPRs are not simple responses to optokinetic stimulation, but are modulated by the configuration of the environment. The aim of this paper is to explore which factors modulate VEPRs in a high-quality virtual reality (VR) environment where real and virtual foreground objects served as static visual, auditory and haptic reference points. Data from four experiments on visually evoked postural responses show that: 1) visually evoked postural sway in the lateral direction is modulated by the presence of static anchor points provided by haptic, visual or auditory reference signals; 2) real objects and their matching virtual reality representations have different effects on postural sway when used as visual anchors; 3) visual motion in the anterior-posterior plane induces robust postural responses that are not modulated by the presence of reference signals or by the reality of objects that could serve as visual anchors in the scene. We conclude that automatic postural responses to laterally moving visual stimuli are strongly influenced by the configuration and interpretation of the environment and draw on multisensory representations. Different postural responses were observed for real and virtual visual reference objects. On the basis that automatic visually evoked postural responses in high-fidelity virtual environments should mimic those seen in real situations, we propose to use the observed effect as a robust objective test for presence and fidelity in VR.
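Since the proposed presence test hinges on quantifying lateral postural sway, a minimal sketch of such a metric may help. The Python below is illustrative only, not code from the paper: it assumes mediolateral centre-of-pressure (COP) time series recorded from a force plate, summarises each trial as RMS sway, and reports the ratio of sway under visual motion to a quiet-stance baseline. The function names, the RMS summary, and the synthetic demo data are all assumptions.

```python
"""Illustrative sketch (not from the paper): quantifying visually
evoked postural sway as an objective presence/fidelity metric."""
import numpy as np

def rms_sway(cop_ml: np.ndarray) -> float:
    """RMS mediolateral sway about the trial mean (metres).

    `cop_ml` is a hypothetical 1-D array of mediolateral
    centre-of-pressure samples from one trial."""
    cop_ml = np.asarray(cop_ml, dtype=float)
    return float(np.sqrt(np.mean((cop_ml - cop_ml.mean()) ** 2)))

def sway_modulation(motion_trials, baseline_trials) -> float:
    """Ratio of mean RMS sway in visual-motion trials to quiet-stance
    baseline; values near 1 indicate no visually evoked response."""
    motion = np.mean([rms_sway(t) for t in motion_trials])
    baseline = np.mean([rms_sway(t) for t in baseline_trials])
    return motion / baseline

if __name__ == "__main__":
    # Synthetic demo data: quiet stance vs. added lateral drift.
    rng = np.random.default_rng(0)
    baseline = [rng.normal(0.0, 0.003, 6000) for _ in range(5)]
    motion = [rng.normal(0.0, 0.003, 6000)
              + 0.004 * np.sin(np.linspace(0, 20, 6000))
              for _ in range(5)]
    print(f"sway modulation: {sway_modulation(motion, baseline):.2f}")
```

Under this scheme, a high-fidelity virtual anchor would be expected to reduce the modulation ratio toward the value observed with a matching real anchor.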
Related articles
Advantages of haptic feedback in virtual reality supported balance training: a pilot study
Repetitive, goal-based tasks supported by virtual reality technology have proven successful in balance training for the stroke population. However, adding a haptic experience can, besides increasing the difficulty level of the task, enable assessment of postural responses. We demonstrated in a single subject with stroke that haptic feedback can be used not only for interaction with virtual environmen...
Comparison of the Wave Amplitude of Visually Evoked Potential in Amblyopic Eyes between Patients with Esotropia and Anisometropia and a Normal Group
Background: We compared the wave amplitude of visually evoked potential (VEP) between patients with esotropic and anisometropic amblyopic eyes and a normal group. Methods: The wave amplitude of VEP was documented in 2 groups of persons with amblyopia (15 with esotropia and 28 with anisometropia) and 1 group of individuals with normal visual acuity (n = 15). The amplitude of P100 was recorded mono...
Enabling People with Visual Impairments to Navigate Virtual Reality with a Haptic and Auditory Cane Simulation
Traditional virtual reality (VR) mainly focuses on visual feedback, which is not accessible for people with visual impairments. We created Canetroller, a haptic cane controller that simulates white cane interactions, enabling people with visual impairments to navigate a virtual environment by transferring their cane skills into the virtual world. Canetroller provides three types of feedback: (1...
A Review on Applications of Haptic Systems, Virtual Reality, and Artificial Intelligence in Medical Training in COVID-19 Pandemic
This paper presents a survey on haptic technology, virtual reality, and artificial intelligence applications in medical training during the COVID-19 pandemic. Over the last few decades, there has been a great deal of interest in using new technologies to establish capable approaches for medical training purposes. These methods are intended to minimize surgery's adverse effects, mostly w...
The sound of your lips: electrophysiological cross-modal interactions during hand-to-face and face-to-face speech perception
Recent magneto-encephalographic and electro-encephalographic studies provide evidence for cross-modal integration during audio-visual and audio-haptic speech perception, with speech gestures viewed or felt through manual tactile contact with the speaker's face. Given the temporal precedence of the haptic and visual signals over the acoustic signal in these studies, the observed modulation of N1/P2 a...
Journal:
Volume 8, Issue
Pages: -
Published: 2013